In the past few years, there has been explosive growth in the cross-platform tools market. Both the number of customers using portability tools and the number of tools available have grown dramatically. On the surface, most of these tools perform a similar function. They provide a means of creating the various components of your application: the windows, controls, and menus. They also provide mechanisms for tasks such as drawing text and graphics, and they allow you to compile your code so that it runs with little or no change on a variety of operating systems.
It is often clear that the quality and capability of these tools vary greatly, but it is not always as apparent that they also vary in their underlying architecture. While understanding the architecture of some software tools is not critical -- knowing how a word processing program internally handles text, for example, is not necessarily important to using it -- understanding how a cross-platform toolkit is put together is fundamental to understanding both its short- and long-term capabilities. There are three general categories of implementation architecture for cross-platform toolkits: emulation approaches (also known as virtual operating systems), least common denominator (LCD) approaches, and the hybrid superset approach, of which zApp is an example.
Each of these implementation architectures differs in how it works with the standard operating systems. Each operating system has an underlying application programming interface (API) that provides facilities for tasks such as creating windows and menus. For example, Microsoft Windows has the Microsoft Windows API, and Motif has three levels of API -- Motif, Xt, and Xlib. Using the standard native API to create windows, menus, and other basic visual components ensures a standard look and feel for the application, because the code that renders those components is the standard code shipped with the operating system. A number of other important characteristics also stem from whether the cross-platform tool vendor uses, extends, or replaces the standard native systems. The tool's implementation architecture determines its capabilities now and in the future, and has a major impact on the efficiency, extensibility, and maintainability of the software developed with the tool.
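To make the contrast concrete, here is a minimal sketch of the same task -- putting a window on the screen -- written against each native API. It uses only standard Win32 and Motif/Xt calls; the USE_MOTIF compile flag is our own device for showing both paths in one listing.

    // Creating a window with each platform's native API. The same logical
    // task requires entirely different calls and program structure.
    #ifdef USE_MOTIF

    #include <Xm/Xm.h>
    #include <Xm/PushB.h>

    int main(int argc, char* argv[])
    {
        XtAppContext app;
        // Xt initializes the toolkit and creates the top-level shell.
        Widget top = XtVaAppInitialize(&app, "Demo", NULL, 0,
                                       &argc, argv, NULL, NULL);
        Widget button = XmCreatePushButton(top, (char*)"press", NULL, 0);
        XtManageChild(button);
        XtRealizeWidget(top);
        XtAppMainLoop(app);
        return 0;
    }

    #else  /* Win32 */

    #include <windows.h>

    LRESULT CALLBACK wndProc(HWND h, UINT msg, WPARAM wp, LPARAM lp)
    {
        if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
        return DefWindowProc(h, msg, wp, lp);
    }

    int WINAPI WinMain(HINSTANCE inst, HINSTANCE, LPSTR, int show)
    {
        // Win32 requires a registered window class before CreateWindow.
        WNDCLASS wc = {};
        wc.lpfnWndProc   = wndProc;
        wc.hInstance     = inst;
        wc.lpszClassName = "Demo";
        RegisterClass(&wc);

        HWND hwnd = CreateWindow("Demo", "Demo", WS_OVERLAPPEDWINDOW,
                                 CW_USEDEFAULT, CW_USEDEFAULT, 300, 200,
                                 NULL, NULL, inst, NULL);
        ShowWindow(hwnd, show);

        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0)
            DispatchMessage(&msg);
        return 0;
    }

    #endif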
Utilizing the standard native API means that many changes in the underlying windowing system are passed on to (or inherited by) the application built on top of the toolkit. This approach also provides easy integration with platform-specific APIs and code. For example, you can take a zApp window object and pass it directly to calls expecting a Microsoft Windows window handle or a Motif widget, as the sketch below illustrates. While this does allow the user to create non-portable, platform-specific code, it provides much greater integration with existing legacy code.
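Here is a hypothetical illustration of that pattern on the Win32 side. The class and member names (PortableWindow, nativeHandle) are invented for this sketch and are not the actual zApp interface; the point is that a hybrid toolkit's window wraps the genuine system handle, so it can be handed straight to native calls.

    #include <windows.h>

    // Illustrative wrapper in the hybrid style: it holds the real HWND
    // rather than emulating one.
    class PortableWindow {
    public:
        explicit PortableWindow(HWND h) : nativeHandle(h) {}
        // A conversion operator lets the wrapper be used anywhere the
        // operating system expects a genuine window handle.
        operator HWND() const { return nativeHandle; }
    private:
        HWND nativeHandle;  // the real system object, not an imitation
    };

    // Platform-specific code mixes freely with toolkit objects:
    void retitle(const PortableWindow& w)
    {
        // SetWindowText is a standard Win32 call; it accepts the wrapper
        // because the conversion yields the underlying HWND.
        SetWindowText(w, "Updated through the native API");
    }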
The hybrid superset approach builds on the shoulders of the standard operating system vendors -- Microsoft, the OSF (and the UNIX vendors), IBM, and Apple. Rather than replacing the standard windowing and operating systems, it takes advantage of them. This gives users optimum performance; minimizes the footprint of applications; provides clear and direct access to the true native look, feel, and facilities of the standard operating systems of today and tomorrow; and provides an immediate migration path to new versions of the standard operating systems.
LCD approaches initially appeared as a quick and relatively easy way to take advantage of the standard underlying APIs available on each platform. They provide true native look and feel as well as cross-platform support. While they do utilize the underlying system, they typically do so at a level that does not provide enough control for commercial-grade applications. They also have glaring omissions in their feature sets. For example, a toolkit may support Asian languages (double-byte character sets) but lack the ability to create a specific font, which is necessary when rendering Asian-language text in graphics.
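As an illustration of the kind of control at issue, the Win32 sketch below selects a specific font face and character set for graphics output -- a routine native call that an LCD toolkit with no font-creation API simply cannot express. The helper function and face name here are illustrative, not part of any particular toolkit.

    #include <windows.h>

    // Create a Japanese (double-byte) font for text drawn into a device
    // context. An LCD toolkit that exposes no font-creation call leaves
    // this capability out of reach.
    HFONT makeJapaneseFont(HDC dc, int pointSize)
    {
        // Convert point size to logical units for this device context.
        int height = -MulDiv(pointSize, GetDeviceCaps(dc, LOGPIXELSY), 72);
        return CreateFont(height, 0, 0, 0, FW_NORMAL,
                          FALSE, FALSE, FALSE,
                          SHIFTJIS_CHARSET,            // double-byte character set
                          OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS,
                          DEFAULT_QUALITY, DEFAULT_PITCH | FF_DONTCARE,
                          "MS Mincho");                // face name, illustrative
    }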
LCD toolkits are typically used when portability is the only concern, and when the limited functionality that the tool provides is adequate for the required task.
The emulation toolkit provider must re-implement much or all of what has been created by Microsoft, OSF (and the UNIX vendors), IBM, and Apple combined. This is especially true when the user interface of a particular system changes, as is happening now with Microsoft Windows. The first consequence is technology lag: while applications that utilize the underlying system's API can frequently adopt the new system's look and feel without even re-compiling, applications written using an emulation library will not have the new look and feel until the tool vendor re-writes the library, the tool user re-compiles the application, and the new application is distributed to end users. The emulation toolkit developer is left playing catch-up with the system vendors.
Second, there is increased overhead with emulation libraries. Since the application must include code to emulate (and replace) facilities that the underlying system already provides, programs written with emulation toolkits will always have greater overhead than those that utilize the underlying system.
Third, integration with platform-specific APIs is more difficult or impossible with emulation toolkits, because the standard operating system facilities rely on the objects used by the program being the real thing, not an imitation. This is especially important as the systems themselves add many new API features (such as OLE and OpenDoc) that you may want to take advantage of. Until and unless the emulation toolkit vendor reimplements these new facilities, the tool user cannot use them (even by giving up portability).
Finally, there is the risk that the emulation is not 100% accurate. The differences can be subtle, and depend entirely on how good a job the vendor does -- but keep in mind, it's a big job. Emulation toolkits are frequently very large bodies of source code, sometimes over a million lines.
Inmark developers and technical support staff are on hand at the training seminars to give assistance. In addition, every attendee has a fully loaded computer at their exclusive disposal. The course is strictly limited to 20 attendees to ensure individual attention.
The cost of the seminar is $1495 per person. Discounts are available for groups of 4 or more. For a list of upcoming dates or for more information, contact your sales representative at 800-346-6275.
On-site training is also available through Inmark; contact the same number for details.
© Copyright 1995, Rogue Wave Software, Inc.